There is always a shadow.
2019-01-25
https://gyazo.com/94771e8fc3c1fdae46962ed3d5b9fb9a
The natural order is Vector -> Linear Algebra -> Machine Learning, I think, but after learning machine learning without really understanding it, I came to understand linear algebra better.
In the model I described in [Principle that the learning curve is an S-curve], I wrote:
Acquiring abstract knowledge makes it easier to acquire the concrete knowledge that lies beneath it.
Is this true?
Is the act of "learning machine learning without really understanding it" truly the direct acquisition of knowledge at a higher level of abstraction?
When you are learning without understanding, are you really acquiring anything near the top of the pyramid?
Rather, aren't we learning the shadow cast on the ground?
The tower casts a flat shadow on the ground.
The shadow contains information corresponding both to the highly abstract parts and to the concrete layers of lower abstraction, all flattened onto the same plane.
By learning this flattened information without understanding it, we accumulate knowledge of the form "A and B are connected."
When you eventually come to understand A, the A that had been lying on the ground is lifted up to its proper height, and B and C, connected to it, are pulled upward as well.
2023-07-11
After four years, it finally connected.
---
This page is auto-translated from /nishio/常に影がある using DeepL. If you find something interesting but the auto-translated English is not good enough to understand, feel free to let me know at @nishio_en. I'm very happy to spread my thoughts to non-Japanese readers.